Patent abstract:
The system (1) comprises a drone (D) provided with a camera (C) and a ground station, the camera (C) being oriented along a line of sight (3), the drone (D) being adapted to move autonomously to carry out an animated shooting of a target (T) moving with the ground station, the orientation of the line of sight (3) being such that the target (T) remains present in the successive images produced by said shooting. The system further comprises means (6) for determining the speed vector (VT1, VT2) of the target (T) and the position (TP1, TP2) of the target (T) in a given reference frame, and control means (2) configured to generate flight instructions from the determined speed vector (VT1, VT2), the determined position (TP1, TP2) and a predetermined directional angle (αp), so as to maintain the angle between the line of sight (3) of the camera (C) and the direction of the speed vector (VT1, VT2) substantially at the value of said predetermined directional angle (αp).
Publication number: FR3056921A1
Application number: FR1659605
Filing date: 2016-10-05
Publication date: 2018-04-06
Inventor: Edouard LEURENT
Applicant: Parrot Drones SAS
IPC main class:
Patent description:

FRENCH REPUBLIC - NATIONAL INSTITUTE OF INDUSTRIAL PROPERTY, Courbevoie
Publication number: 3 056 921 (to be used only for reproduction orders)
National registration number: 16 59605
Int. Cl.8: A63H 27/133 (2017.01), A63H 30/04, H04M 1/72, G08C 17/02, H04N 5/232
A1 PATENT APPLICATION
Date of filing: 05.10.16
Applicant: PARROT DRONES, simplified joint-stock company - FR
Inventor: LEURENT, Edouard
Date of public availability of the request: 06.04.18, Bulletin 18/14
List of documents cited in the preliminary search report: refer to the end of this booklet
Holder: PARROT DRONES, simplified joint-stock company
Agent: BARDEHLE PAGENBERG
AUTONOMOUS DRONE-BASED SHOOTING SYSTEM WITH TARGET TRACKING AND HOLDING OF THE SHOOTING ANGLE OF THE TARGET.
FR 3 056 921 - A1. The system (1) comprises a drone (D) provided with a camera (C) and a ground station, the camera (C) being oriented along a line of sight (3), the drone (D) being able to move autonomously to carry out an animated shooting of a target (T) moving with the ground station, the orientation of the line of sight (3) being such that the target (T) remains present in the successive images produced by said shot. The system further comprises means (6) for determining the speed vector (VT1, VT2) of the target (T) and the position (TP1, TP2) of the target (T) in a given coordinate system, and control means (2) configured to generate flight instructions from the determined speed vector (VT1, VT2), the determined position (TP1, TP2) and a predetermined directional angle (αp), so as to maintain the angle between the line of sight (3) of the camera (C) and the direction of the speed vector (VT1, VT2) substantially at the value of said predetermined directional angle (αp).
The invention relates to remote-controlled flying motorized devices, hereafter generally designated by the name of drones. The invention is particularly applicable to rotary-wing drones such as quadricopters, a typical example of which is the Bebop 2 from Parrot SA, Paris, France: a drone equipped with a series of sensors (accelerometers, three-axis gyrometers, altimeter), a front camera capturing an image of the scene towards which the drone is heading, and a vertical-aiming camera capturing an image of the terrain overflown.
Rotary-wing drones are provided with multiple rotors driven by respective motors that can be controlled in a differentiated manner in order to pilot the drone in attitude and speed.
Documents WO 2010/061099 A2 and EP 2 364 757 A1 (Parrot SA) describe such a drone as well as its principle of piloting via a station, generally on the ground, such as a multimedia telephone or portable music player with an integrated touch screen and accelerometer, for example an iPhone-type smartphone or an iPad-type multimedia tablet (registered trademarks). These stations incorporate the various control elements necessary for the detection of piloting commands and for the bidirectional exchange of data with the drone via a wireless link of the local Wi-Fi network type (IEEE 802.11) or Bluetooth. They are also provided with a touch screen displaying the image captured by the front camera of the drone, with a number of symbols superimposed allowing the activation of commands by simple touch of the user's finger on this touch screen.
The drone's front video camera can be used to capture sequences of images of a scene towards which the drone is turned. The user can thus use the drone in the same way as a camera or camcorder that, instead of being held in the hand, is carried by the drone. The images collected can be saved and then broadcast, posted on video-hosting websites, sent to other Internet users, shared on social networks, etc.
The front camera can be an orientable camera, in order to orient the line of sight, and therefore the field of the images transmitted with the video stream, in a controlled manner in a predetermined direction. A technique implemented in particular in the aforementioned Bebop 2 device and described in EP 2 933 775 A1 consists in using a high-definition wide-angle camera fitted with a hemispherical-field fisheye-type lens covering a field of approximately 180°, and in windowing in real time the raw image delivered by this sensor, by software processing that selects the useful pixels of the raw image in a capture zone determined as a function of a certain number of parameters, including pointing commands towards a particular target chosen by the user or automatically followed by the drone. As a variant, or even as a complement, to controlling the camera's line of sight by software windowing, it is also possible to mount the camera on a three-axis articulated gimbal-type support with Cardan suspension, fitted with servomotors controlled according to gyrometric data and pointing commands. The invention obviously applies to any type of camera, orientable or not, and whatever its pointing mode.
In a so-called tracking mode, the drone can be programmed to follow a moving target whose coordinates are known, so that during its flight the camera's line of sight is directed towards said target. This target is typically the station itself, carried by a user who may be in motion (for example practising a sport in which he is moving: running, sliding, driving, etc.). In this mode, the drone is able to film the user's movements without the user having to act on the displacements of the drone or on the line of sight of the camera.
The drone following the target object adjusts its position and/or the position of the camera unit so that the target object is always filmed by the drone. Since the drone is autonomous, that is to say its displacement is calculated by the drone and not piloted by a user, it determines its trajectory according to the movements of the target object and controls the camera unit so that it always points in the direction of the target object to be filmed. Thus, when the target moves, the drone is not only capable of following the target, but it also positions itself so as to orient the camera so that its line of sight points towards the target.
For this purpose, the coordinates of the ground station, obtained by a GPS unit with which it is equipped in a manner known per se, are communicated to the drone over the wireless link, and the drone can therefore adjust its movements so as to follow the target and keep the camera's line of sight directed towards it, so that the image remains framed on the subject.
In tracking mode, it is known that the drone follows the movement of the target that is in the camera's field of view. Thus, when the target follows a trajectory that moves away from the drone, the latter detects the movement of the target and moves towards it in turn, so that the target remains in the camera's field of view. In this process, the drone comes to position itself behind the target in order to follow it at a distance. Images are thus obtained with shots taken from behind the target. In addition, the angle of view of the images changes as the target moves. Indeed, when the target makes a turn while the drone follows it, the drone ends up behind or in front of the target depending on the direction of the turn. As a result, the camera's angle of view relative to the target's path is very different from what it was before the turn.
An object of the invention is to propose a system allowing a drone, in the autonomous shooting mode of a moving target, on the one hand to maintain the same shooting angle of the target during the pursuit, and on the other hand to maintain the relative positioning of the drone around the target.
To this end, the invention provides a system for taking animated shots, comprising a drone provided with a camera and a ground station communicating by a wireless link with the drone, the camera being oriented along a line of sight, the movements of the drone being defined by flight instructions applied to a propulsion unit or a set of propulsion units of the drone, the drone being able to move autonomously to carry out an animated shooting of a target moving with the ground station, the orientation of the line of sight being such that the target remains present in the successive images produced by said shooting.
Characteristically of the invention, the system comprises:
- means for determining the target speed vector and the position of the target in a given frame of reference, and
- control means configured to generate said flight instructions from:
i. the determined speed vector,
ii. the determined position, and
iii. a predetermined directional angle,
so as to maintain the angle between the line of sight of the camera and the direction of the speed vector substantially at the value of said predetermined directional angle (αp).
The following features can be taken together or separately.
Preferably, the system includes means for activating the tracking of the target by the drone, capable of:
- ordering the activation of target tracking, and
- calculating the value of said predetermined directional angle at the time of said activation.
Said flight instructions generated by the control means can be generated from a servo loop on a command to maintain said predetermined directional angle.
Preferably also, the control means are configured to generate said flight instructions to further control the movement of the drone at a predetermined distance between the drone and the target.
In a preferred form of implementation, the means for activating the tracking of the target by the drone are further able to calculate the value of said predetermined distance at the time of said activation.
Said flight instructions generated by the control means can also be generated from a servo loop on an instruction to maintain said predetermined distance.
Preferably, the control means are configured to generate said flight instructions so as to further control the movement of the drone to maintain a predetermined elevation angle, the predetermined elevation angle being an angle between the line of sight of the camera and a horizontal plane.
In a preferred embodiment, the means for activating the tracking of the target by the drone are further able to calculate the value of said predetermined elevation angle at the time of said activation.
Said flight instructions generated by the control means can also be generated from a servo loop on a command to maintain said predetermined elevation angle.
In a particular embodiment, the direction of the line of sight of the camera is fixed relative to a main axis of the drone, the control means being configured to generate flight instructions so as to direct the line of sight from the camera to the target when the target is followed by the drone.
In another particular embodiment, the direction of the line of sight of the camera can be modified relative to a main axis of the drone by means of modification, the modification means being configured to direct at least partly the line of sight from the camera to the target when the target is followed by the drone.
Said means for determining the speed vector and the position of the target can operate by observing the successive GPS geographic positions of the target, the given reference being a terrestrial reference.
Said means for determining the speed vector and the position of the target can operate by analysing the images delivered by the drone's camera, the given reference frame being a frame linked to the drone.
In this case, the analysis of the images delivered by the camera is preferably an analysis of the position of the target in the images generated successively by the camera of the drone, and the system comprises means for locating and tracking said position in the successive images.
We will now describe an example of implementation of the present invention, with reference to the accompanying drawings, in which the same references designate identical or functionally similar elements from one figure to another.
Figure 1 is a schematic overview of a camera system comprising a drone and a ground station.
Figure 2 is a schematic representation of a top view of the system of Figure 1 according to the invention, the target and the drone being each represented in an initial position and in a subsequent position.
Figure 3 is a schematic representation of the means implemented in the system of Figure 1.
Figure 4 is a schematic representation of a side view of the system of Figure 1 according to the invention, the target and the drone being each represented in an initial position and in a subsequent position.
The invention applies to a drone D, for example a quadricopter-type drone such as the Parrot Bebop 2 Drone, the various technical aspects of which are described in EP 2 364 757 A1, EP 2 613 213 A1, EP 2 450 862 A1 or even EP 2 613 214 A1 mentioned above.
The invention relates to a system 1 for taking animated shots, comprising a drone D provided with a camera C and a ground station S communicating by a wireless link with the drone D, shown in Figure 1.
The drone D comprises a propulsion unit or a set of propulsion units comprising coplanar rotors whose motors are controlled independently by an integrated navigation and attitude-control system. It is provided with a front-view camera C making it possible to obtain an image of the scene towards which the drone is oriented. The camera C is oriented along an aiming axis 3, as shown in Figure 2. Inertial sensors (accelerometers and gyrometers) make it possible to measure, with a certain precision, the angular speeds and the attitude angles of the drone, that is to say the Euler angles (pitch, roll and yaw) describing the inclination of the drone relative to a horizontal plane of a fixed terrestrial reference frame. An ultrasonic rangefinder placed under the drone D also provides a measurement of the altitude relative to the ground. The drone D is also provided with location means making it possible to determine its absolute position DP1, DP2 in space, in particular from data coming from a GPS receiver.
The drone D is controlled by the ground station S, typically in the form of a remote control, for example of the model-aircraft remote-control type, a smartphone or a smart tablet. The smartphone or smart tablet is provided with a touch screen E displaying the image captured on board by the front camera C, with a number of symbols superimposed allowing the activation of control commands by simple touch of a user's finger on the touch screen E. When the drone D is piloted by a remote-control-type station S, the user can be provided with immersion piloting glasses, often called FPV (First Person View) glasses. The station S is also provided with means of radio link with the drone D, for example of the local Wi-Fi network type (IEEE 802.11), for the bidirectional exchange of data: from the drone D to the station S, in particular for the transmission of the image captured by the camera C and of flight data, and from the station S to the drone D for sending piloting commands.
The system constituted by the drone D and the station S is configured to give the drone the ability to follow and film a target autonomously. Typically, the target is constituted by the station S itself, carried by the user.
According to the invention, the tracking of the target by the drone is carried out while maintaining the same angle of view of the target by the camera C of the drone D. The movements of the drone D are defined by flight instructions generated by control means of the navigation system of the drone D and applied to the propulsion unit or to the set of propulsion units of the drone D.
According to the invention, illustrated in Figures 2 and 3, to maintain the same angle of view of the target in the successive images, the instructions of the control means 2 are generated so as to maintain a predetermined directional angle αp formed between the aiming axis 3 of the camera C and the direction of the speed vector VT1, VT2 of the target T. This angle corresponds substantially to the angle of view of the target by the camera C of the drone D. A value, for example fixed, of the predetermined directional angle αp is determined at a given time, and this value is substantially maintained in the mode of tracking of the target T by the drone D. In other words, the drone D follows a displacement as a function of the displacement of the target T so that the current directional angle α remains substantially equal to the value of the predetermined directional angle αp during the respective movements of the target T and of the drone D. Thus, the predetermined directional angle αp is the angle at which one wishes to take the continuous shooting of the target T. According to another embodiment, the value of the predetermined directional angle αp can be chosen from a set of values prerecorded in the system 1.
To this end, the control means 2 of the system are configured to generate said flight instructions from:
i. the speed vector VT1, VT2 of the target T,
ii. the position TP1, TP2 of the target T, and
iii. a predetermined directional angle αp.
The orientation of the line of sight 3 of the drone camera is such that the target T remains present on the successive images produced by said shooting.
In a first embodiment, the direction of the aiming axis 3 of the camera C is fixed relative to a main axis of the drone. The control means 2 are therefore configured to generate flight instructions so as to position the main axis of the drone so that the line of sight 3 of the camera C is directed towards the target T during the tracking of the target T by drone D.
In a second embodiment, the direction of the aiming axis 3 of the camera C can be modified relative to a main axis of the drone by modification means. The modification means are configured to direct, at least partially, the line of sight 3 of the camera towards the target T during the tracking of the target by the drone D. The camera C is for example a fixed fisheye-type camera with a hemispherical field, as described for example in EP 2 933 775 A1 (Parrot). With such a camera, the changes in the aiming axis 3 of the camera C are made not by physical movement of the camera, but by cropping and reprocessing the images taken by the camera as a function of a virtual aiming angle, determined relative to the main axis of the drone and given as a setpoint. Camera C can also be a mobile camera assembled to the drone, for example under the body of the drone; in this case, the modification means include motors for rotating the camera along at least one of the three axes, or even all three axes, in order to orient the line of sight of the camera so that the target remains present in the successive images produced by said shooting.
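The virtual-aiming idea can be sketched in a few lines. Under the simplifying, hypothetical assumption of an equirectangular mapping of the roughly 180° hemispherical field onto the raw image width, a virtual yaw setpoint translates into the horizontal origin of the capture window; this illustrates the windowing principle only, not the actual processing of the Bebop 2.

```python
def window_origin(raw_width, yaw_deg, fov_deg=180.0, win_width=640):
    """Horizontal pixel origin of the capture window selected in the raw
    fisheye image for a virtual aiming yaw (0 deg = optical axis).
    Simplified equirectangular mapping, for illustration only."""
    # Map the yaw angle [-fov/2, +fov/2] onto the raw image width
    centre = (yaw_deg + fov_deg / 2.0) / fov_deg * raw_width
    # Centre the capture window on that column, clamped to the image
    origin = int(round(centre - win_width / 2.0))
    return max(0, min(origin, raw_width - win_width))
```

A yaw of 0° selects a window centred in the raw image; yaws near the edges of the field are clamped so the window stays inside the sensor area.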
The coordinates of the position TP1, TP2 of the target T make it possible to determine the orientation of the aiming axis 3 of the camera C so that the target T remains present in the successive images produced during the shooting. The coordinates of the line of sight 3 of the camera C are determined using the drone's sensors, which determine the position of the drone D.
The coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T relative to the drone D make it possible to determine the current directional angle α between the line of sight 3 of the camera C and the direction of the speed vector VT1, VT2.
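As a minimal sketch of this computation (a hypothetical helper, working in a horizontal two-dimensional frame with positions in metres; the patent does not specify the actual algorithm), the current directional angle α can be obtained from the drone position, the target position and the target speed vector:

```python
import math

def directional_angle(drone_pos, target_pos, target_vel):
    """Current directional angle (radians, wrapped to (-pi, pi]) between
    the camera line of sight (drone -> target) and the direction of the
    target's speed vector, in the horizontal plane."""
    # The line of sight runs from the drone towards the target
    los_heading = math.atan2(target_pos[1] - drone_pos[1],
                             target_pos[0] - drone_pos[0])
    vel_heading = math.atan2(target_vel[1], target_vel[0])
    diff = vel_heading - los_heading
    # Wrap the difference into (-pi, pi]
    return math.atan2(math.sin(diff), math.cos(diff))
```

For example, a drone directly behind a target that moves perpendicular to the line of sight sees a directional angle of 90°.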
The control means 2 are for example configured to generate said flight instructions from a servo loop on a setpoint of maintaining said predetermined directional angle αp, for example using a calculation unit provided with an execution program intended for this purpose. The principle of servo control is to continuously measure the difference between the current value of the quantity to be controlled and the predetermined value to be reached, in order to determine the appropriate control instructions for reaching the predetermined value. Thus, the control means 2 first determine the current directional angle α, then issue flight instructions so that the drone D moves to a position DP2 in which the current directional angle α corresponds to the predetermined directional angle αp. The servo loop is repeated continuously by the control means 2 to maintain the value of the predetermined directional angle αp.

Referring to Figure 2, the drone D is shown schematically in autonomous movement, equipped with the camera C which captures a sequence of animated images of the target T. The target T has an initial position TP1 and the drone D has an initial position DP1, defined in the terrestrial reference frame. The target T moves with a speed vector VT1 at position TP1 and VT2 at position TP2, the direction and magnitude of which change over time. In the initial positions, the axis of the camera C is directed towards the target T and forms, with the direction of the speed vector VT1, a directional angle that corresponds to the predetermined directional angle αp. In the same Figure 2, the target is shown in a subsequent position TP2 and the drone in a subsequent position DP2. The target T passes from the initial position TP1 to the subsequent position TP2. In the tracking mode, the drone D moves from the initial position DP1 to the subsequent position DP2, thanks to flight instructions generated by the control means.
The flight instructions are defined so as to maintain the same predetermined directional angle αp as that of the initial positions. Thus, in their respective subsequent positions TP2, DP2, it is observed that the directional angle formed between the line of sight 3 of the camera C and the direction of the speed vector VT2 is substantially identical to the one defined in the initial positions TP1, DP1 of the target T and the drone D. Thus, thanks to the system 1 of the invention, the angle of view of the target T by the camera C remains the same despite the displacements of the target T and of the drone D.
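The servo principle described here, measuring the difference between the current value and the setpoint and correcting continuously, can be illustrated by a simple proportional loop (a sketch only; the gains and loop structure of the real autopilot are not specified in the text):

```python
def servo_step(current, setpoint, gain=0.5):
    """One servo iteration: proportional correction of the controlled
    quantity (here the directional angle) towards its setpoint."""
    return current + gain * (setpoint - current)

def run_servo(current, setpoint, steps=30, gain=0.5):
    """Repeat the loop, as the control means 2 do continuously."""
    for _ in range(steps):
        current = servo_step(current, setpoint, gain)
    return current
```

Each iteration shrinks the error by the gain factor, so the current directional angle converges to the predetermined value and then stays there as long as the loop keeps running.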
As shown in Figures 2 and 3, and in order to allow the control means 2 to calculate the current directional angle α, the system 1 also comprises means 6 for determining the position TP1, TP2 and the speed vector VT1, VT2 of the target T in a given coordinate system. The determination means 6 transmit the coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T to the control means 2. The determination means 6 determine these coordinates repeatedly, so as to transmit updated values to the control means 2.
In a first embodiment, said means 6 for determining the speed vector VT1, VT2 and the position of the target T operate by observing the successive GPS geographic positions of the target T. The given frame of reference for determining the speed vector VT1, VT2 and the position TP1, TP2 of the target T is therefore a terrestrial reference. The determination means 6 receive the successive GPS positions of the target T over time. The determination means 6 can thus deduce therefrom the coordinates of the speed vector VT1, VT2 of the target T. The position of the target T is given by the GPS coordinates of the ground station.
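In this first embodiment, a finite difference over two successive GPS fixes (assumed here to be already converted to local metric coordinates) yields the speed vector; this is a sketch of what the determination means 6 might compute, not Parrot's actual implementation:

```python
def speed_vector(prev_fix, curr_fix, dt):
    """Finite-difference estimate of the target speed vector from two
    successive GPS positions (local East/North metres), dt seconds apart."""
    if dt <= 0:
        raise ValueError("time step must be positive")
    return tuple((c - p) / dt for p, c in zip(prev_fix, curr_fix))
```

With GPS fixes typically delivered once per second, dt is about 1.0 and the estimate is refreshed at the same rate.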
According to a first variant of this first embodiment, the determination means 6 are arranged in the drone D. The GPS positions of the target T are transmitted by the target T to the determination means 6 of the drone D.
According to a second variant of this first embodiment, the determination means 6 are arranged in the ground station of the target T. Here, the coordinates of the speed vector VT1, VT2 and of the position TP1, TP2 of the target T are determined in the ground station, then transmitted to drone D.
In a second embodiment, said means 6 for determining the speed vector VT1, VT2 and the position of the target T operate by analysing the images delivered by the camera C of the drone D. The reference frame given here is a frame linked to the drone D. In this case, the analysis of the images delivered by the camera C is an analysis of the position TP1, TP2 of the target T in the images successively generated by the camera C of the drone D. The determination means 6 comprise means for locating and tracking the position TP1, TP2 of the target T in the successive images. In this particular embodiment, the determination means 6 are located in the drone D.

For this purpose, an image-analysis program provided in the determination means 6 on board the drone D, or in a dedicated circuit, is configured to follow the displacement of the target T in the sequence of images generated by the camera C, and to deduce therefrom in which angular direction the target T is located relative to the line of sight 3 of the camera C. More precisely, this program is configured to locate and follow, in the successive images, a visual pattern or a spot of colour representative of the visual aspect of the target with respect to a background (for example a pattern developed by analysing the grey levels of the image). For example, for the shooting of a target user practising a snow sport, the background will generally be white and the colour of the spot in the images will be that of the user's clothes. This approach also makes it possible to obtain the angular position data of the user to be followed at a rate substantially faster than that at which the GPS coordinates are delivered (in general once per second), namely the frame rate, which is typically 30 frames per second for this type of application.
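The pattern and colour-spot tracking described above can be caricatured as a centroid search: on a bright, snow-like background, the target shows up as the darker pixels. This toy locator is a hypothetical stand-in for the on-board image-analysis program and returns the target's position in image coordinates:

```python
def locate_target(image, threshold=128):
    """Centroid (x, y) of pixels darker than `threshold` in a greyscale
    image given as a list of rows; None if no target pixel is found."""
    sx = sy = n = 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value < threshold:   # target spot against a bright background
                sx += x
                sy += y
                n += 1
    return (sx / n, sy / n) if n else None
```

The offset of the returned centroid from the image centre is a proxy for the angular direction of the target relative to the line of sight.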
In this second embodiment, the image analysis is associated with another measuring means that provides, at least in part, a geographical position TP1, TP2 of the target T. These means can in particular come from the GPS unit of the ground station, or from a barometric-type pressure sensor arranged in the ground station of the target T. The means for determining the line of sight 3 of the camera C, which are capable of indicating an angular position of the target T relative to the main axis of the drone, are thus supplemented by taking a geographical signal into account. The on-board electronics of the drone is capable of determining the position of the target T by cross-checking the geographic data with the angular detection data. In this way, very precise coordinates of the position TP1, TP2 and of the speed vector VT1, VT2 of the target T are obtained.
According to a particular embodiment, the control means 2 are further configured to generate said flight instructions to control the movement of the drone D at a predetermined distance dp between the drone D and the target T. In other words, in the tracking mode, the distance d between the target T and the camera C is also maintained, in addition to maintaining the angle of view of the camera C. The predetermined distance dp has a fixed value during tracking. Thus, the perception of the dimensions of the target T remains substantially the same during the shooting, with a constant focal length of the camera C. The current distance d between the target T and the camera C is calculated by the control means 2 from the position TP1, TP2 of the target determined by the determination means 6 and from the position DP1, DP2 of the drone determined by its inertial sensors.
The control means 2 are for example configured to generate the flight instructions from a servo loop on a setpoint of maintaining said predetermined distance dp. The process is similar to that of the servo loop on the setpoint of maintaining the predetermined directional angle αp. The control means 2 calculate the current distance d and generate instructions for moving the drone to a position DP2 whose distance to the target T corresponds to the predetermined distance dp. Thus, the distance between the subsequent positions TP2, DP2 of the target T and of the drone D in Figure 2 is substantially identical to the distance between their initial positions TP1, DP1.
According to a particular embodiment, shown in Figures 2 and 4, the control means 2 are further configured to generate flight instructions to control the movement of the drone D so as to maintain a predetermined elevation angle βp defined between the line of sight 3 of the camera C and the horizontal plane π. This predetermined elevation angle βp makes it possible to determine the relative altitude of the drone with respect to the target. By keeping the predetermined elevation angle βp constant, the drone D maintains its altitude relative to the target T. The horizontal plane π is defined with respect to the terrestrial reference frame and can be defined at any altitude. The value of the current elevation angle β is determined by the control means 2 as a function of the line of sight 3 of the camera C and of the horizontal plane π.
The control means 2 are for example configured to generate the flight instructions from a servo loop on a setpoint of maintaining said predetermined elevation angle βp. The method is similar to that of the servo loop on the setpoint of maintaining the predetermined directional angle αp and that of the predetermined distance dp between the camera C and the target T. Thus, as shown in Figure 4, the current elevation angle β relative to the horizontal plane π has the same value between the initial positions TP1, DP1 of the target T and of the drone D and the subsequent positions TP2, DP2.
In a particular embodiment, the control means are configured to generate flight instructions making it possible to modify the position of the drone so as to simultaneously adjust the current directional angle α, the current distance d between the camera C and the target T, and the current elevation angle β towards the three corresponding predetermined values.
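When the three setpoints are combined, the position the drone must reach is fully determined by the target position, the heading of its speed vector, αp, dp and βp. A geometric sketch follows, in a hypothetical local East/North/Up frame in metres with angles in radians; the patent itself leaves the computation unspecified:

```python
import math

def desired_drone_position(target_pos, vel_heading, alpha_p, d_p, beta_p):
    """Drone position realising the directional angle alpha_p, the
    distance d_p and the elevation angle beta_p relative to the target."""
    # Heading of the line of sight (drone -> target), from alpha = vel - los
    los_heading = vel_heading - alpha_p
    horiz = d_p * math.cos(beta_p)            # horizontal range to the target
    return (target_pos[0] - horiz * math.cos(los_heading),
            target_pos[1] - horiz * math.sin(los_heading),
            target_pos[2] + d_p * math.sin(beta_p))   # drone above the target
```

The servo loops then only have to steer the drone towards this point as the target position and velocity heading are refreshed.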
In the embodiment shown in Figure 2, the system 1 comprises activation means 7 for the tracking of the target by the drone. The activation means 7 are able to control the activation and deactivation of the tracking of the target T. The activation means are for example arranged in the ground station so that they can easily be actuated by the user, and include for example an activation button. Thus, when the user presses the button, the drone switches to target-tracking mode.
According to a particular embodiment of the invention, the activation means 7 for the tracking of the target by the drone are capable of calculating at least the value of said predetermined directional angle at the time of said activation. In other words, the activation means 7 define the value of the predetermined directional angle αp that will be maintained during the tracking of the target by the drone. The activation means 7 are for example configured so that the value of the predetermined directional angle is the current directional angle at the time of activation, in particular at the moment the button is actuated.
The activation means 7 calculate the current directional angle as a function of the coordinates of the speed vector and the position of the target, which are transmitted to it by the determination means 6. Thus, the user positions the drone and the camera according to a viewing angle that he wishes to maintain, and activates the tracking mode thanks to the activation means 7 so that the drone follows the target while maintaining the chosen viewing angle.
According to a particular embodiment, the activation means 7 for tracking of the target by the drone are also capable of calculating the value of the predetermined distance dp between the target and the drone at the time of said activation. Similarly, the activation means 7 are, for example, able to calculate the value of said predetermined elevation angle βp at the time of said activation. The predetermined values are transmitted to the control means 2, which store them in a memory. Thus, when the activation means 7 are actuated, the values of the three predetermined parameters are calculated by the activation means 7 and are then maintained during the tracking of the target by the control means 2 as long as the user has not deactivated tracking of the target T.
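At the moment tracking is activated, the three setpoints can thus be snapshotted from the current geometry. The sketch below is a hypothetical illustration (function and variable names are not from the patent) of computing the current directional angle, camera-target distance and elevation angle from the drone and target positions and the target's speed vector, assuming a terrestrial frame with z up:

```python
import math

def snapshot_setpoints(drone_pos, target_pos, target_vel):
    """Hedged sketch of what the activation means (7) could compute when
    tracking is switched on: the current directional angle, the current
    camera-target distance, and the current elevation angle, which are
    then held as the predetermined values alpha_p, d_p, beta_p."""
    dx = target_pos[0] - drone_pos[0]
    dy = target_pos[1] - drone_pos[1]
    dz = target_pos[2] - drone_pos[2]
    d = math.sqrt(dx * dx + dy * dy + dz * dz)           # distance -> d_p
    los_bearing = math.atan2(dy, dx)                     # horizontal line-of-sight bearing
    heading = math.atan2(target_vel[1], target_vel[0])   # target velocity direction
    # Directional angle: signed difference, wrapped to (-pi, pi] -> alpha_p
    alpha = (los_bearing - heading + math.pi) % (2 * math.pi) - math.pi
    beta = math.atan2(-dz, math.hypot(dx, dy))           # elevation above horizontal -> beta_p
    return alpha, d, beta
```

The three returned values would then be stored by the control means and fed as setpoints to the servo loops described above.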
Claims (14)
1. Animated shooting system, comprising a drone (D) provided with a camera (C) and a ground station (S) communicating by a wireless link with the drone, the camera (C) being oriented along a line of sight (3), the movements of the drone (D) being defined by flight instructions applied to a propulsion unit or a set of propulsion units of the drone (D), the drone (D) being able to move autonomously to carry out an animated shooting of a target (T) moving with the ground station (S), the orientation of the line of sight (3) being such that the target (T) remains present in the successive images produced by said shooting, the system being characterized in that it comprises
- means (6) for determining the speed vector (VT1, VT2) of the target (T) and the position (TP1, TP2) of the target (T) in a given reference frame, and
- control means (2) configured to generate said flight instructions from:
i. the determined speed vector (VT1, VT2),
ii. the determined position (TP1, TP2), and
iii. a predetermined directional angle (αp),
so as to maintain the angle between the line of sight (3) of the camera (C) and the direction of the speed vector (VT1, VT2) substantially at the value of said predetermined directional angle (αp).
2. System according to claim 1, characterized in that the system comprises activation means (7) for tracking of the target (T) by the drone (D), able:
- to order the activation of tracking of the target (T), and
- to calculate the value of said predetermined directional angle (αp) at the time of said activation.
3. System according to claim 1 or 2, characterized in that said flight instructions generated by the control means (2) are generated from a servo loop on a setpoint for maintaining said predetermined directional angle (αp).
4. System according to any one of the preceding claims, characterized in that the control means (2) are configured to generate said flight instructions so as to further control the movement of the drone (D) to maintain a predetermined distance (dp) between the drone (D) and the target (T).
5. System according to claim 4 when dependent on claim 2, characterized in that the activation means (7) for tracking of the target (T) by the drone (D) are further able to calculate the value of said predetermined distance (dp) at the time of said activation.
6. System according to claim 4 or 5, characterized in that said flight instructions generated by the control means (2) are also generated from a servo loop on a setpoint for maintaining said predetermined distance (dp).
7. System according to any one of the preceding claims, characterized in that the control means (2) are configured to generate said flight instructions so as to further control the movement of the drone (D) to maintain a predetermined elevation angle (βp), the predetermined elevation angle (βp) being the angle between the line of sight (3) of the camera (C) and a horizontal plane (π).
8. System according to claim 7 when dependent on claim 2, characterized in that the activation means (7) for tracking of the target (T) by the drone (D) are further able to calculate the value of said predetermined elevation angle (βp) at the time of said activation.
9. System according to claim 7 or 8, characterized in that said flight instructions generated by the control means (2) are also generated from a servo loop on a setpoint for maintaining said predetermined elevation angle (βp).
10. System according to any one of the preceding claims, characterized in that the direction of the line of sight (3) of the camera (C) is fixed relative to a main axis of the drone, the control means being configured to generate flight instructions so as to direct the line of sight (3) of the camera (C) towards the target (T) during tracking of the target (T) by the drone.
11. System according to any one of claims 1 to 9, characterized in that the direction of the line of sight (3) of the camera (C) is modifiable with respect to a main axis of the drone by modification means, the modification means being configured to direct, at least partially, the line of sight (3) of the camera (C) towards the target (T) during the tracking of the target (T) by the drone (D).
12. System according to any one of the preceding claims, characterized in that said means (6) for determining the speed vector (VT1, VT2) and the position (TP1, TP2) of the target (T) operate by observing the successive GPS geographic positions of the target (T), the given reference frame being a terrestrial reference frame.
13. System according to any one of the preceding claims, characterized in that said means (6) for determining the speed vector (VT1, VT2) and the position (TP1, TP2) of the target (T) operate by analysis of the images delivered by the camera (C) of the drone (D), the given reference frame being a reference frame linked to the drone.
14. System according to claim 13, characterized in that the analysis of the images delivered by the camera (C) is an analysis of the position (TP1, TP2) of the target (T) in the images generated successively by the camera (C) of the drone (D), and in that it comprises means for locating and tracking said position (TP1, TP2) in the successive images.
Similar technologies:
Publication number | Publication date | Patent title
EP3316068B1|2019-03-06|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle
EP3078402B1|2017-10-04|System for piloting an fpv drone
EP3306428A1|2018-04-11|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle
EP3086195B1|2019-02-20|System for piloting a drone in first-person view mode
EP2933775B1|2016-12-28|Rotary-wing drone provided with a video camera supplying stabilised image sequences
EP2613214B1|2017-08-02|Method for controlling a rotary-wing drone to operate photography by an on-board camera with minimisation of interfering movements
EP3025770B1|2017-01-25|Video system for piloting a drone in immersive mode
EP3142353B1|2019-12-18|Drone with forward-looking camera in which the control parameters, especially autoexposure, are made independent of the attitude
EP3273317A1|2018-01-24|Autonomous system for taking moving images, comprising a drone and a ground station, and associated method
EP3273318B1|2021-07-14|Autonomous system for collecting moving images by a drone with target tracking and improved target positioning
EP2497555B1|2013-08-28|Method for piloting a rotary-wing drone with multiple rotors with curved steering.
FR3031402A1|2016-07-08|METHOD OF AUTOMATICALLY CONTROLLING A ROTARY SAILING DRONE FOR OPERATING CAMERA MOVEMENTS BY AN EMBEDDED CAMERA
EP3276591A1|2018-01-31|Drone with an obstacle avoiding system
FR2985329A1|2013-07-05|METHOD FOR INTUITIVE CONTROL OF A DRONE USING A REMOTE CONTROL APPARATUS
FR2957266A1|2011-09-16|METHOD AND APPARATUS FOR REMOTE CONTROL OF A DRONE, IN PARTICULAR A ROTARY SAILING DRONE.
WO2015055769A1|2015-04-23|Method and device for in-flight terrain identification for microdrone
EP3281871A1|2018-02-14|Method for capturing a video with a fixed-wing drone, related computer program and electronic system
JP6642502B2|2020-02-05|Flight device, method, and program
CN107003680B|2019-11-05|Control the mobile methods, devices and systems of mobile object
CN109076101B|2021-07-23|Holder control method, device and computer readable storage medium
EP3392728A1|2018-10-24|Method for piloting a rotary wing drone, related computer program, electronic apparatus and drone
FR3057966A1|2018-04-27|DRONE PILOTED IN A SPHERICAL FRAME BY A GESTURE WITH SEVERAL LIMB SEGMENTS, PILOTING METHOD AND ASSOCIATED COMPUTER PROGRAM|
FR3079943A1|2019-10-11|ELECTRONIC DEVICE AND METHOD FOR CONTROLLING A DRONE WITH TRAVELING COMPENSATION EFFECT, ASSOCIATED COMPUTER PROGRAM
FR3020168A1|2015-10-23|ROTATING WING DRONE WITH VIDEO CAMERA DELIVERING STABILIZED IMAGE SEQUENCES
Family patents:
Publication number | Publication date
FR3056921B1|2018-11-23|
EP3306428A1|2018-04-11|
US20180095469A1|2018-04-05|
CN107918397A|2018-04-17|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
WO2014200604A2|2013-03-15|2014-12-18|Gilmore Ashley A|Digital tethering for tracking with autonomous aerial robot|
US20160018822A1|2014-07-18|2016-01-21|Helico Aerospace Industries Sia|Autonomous vehicle operation|
WO2016015251A1|2014-07-30|2016-02-04|SZ DJI Technology Co., Ltd.|Systems and methods for target tracking|
US7970507B2|2008-01-23|2011-06-28|Honeywell International Inc.|Method and system for autonomous tracking of a mobile target by an unmanned aerial vehicle|
FR2985581B1|2012-01-05|2014-11-28|Parrot|METHOD FOR CONTROLLING A ROTARY SAILING DRONE FOR OPERATING A SHOOTING VIEW BY AN ON-BOARD CAMERA WITH MINIMIZATION OF DISTURBING MOVEMENTS|
US10591911B2|2017-06-22|2020-03-17|Korea University Research And Business Foundation|Apparatus and method for controlling drone formation|
US10496107B2|2017-01-17|2019-12-03|Valeo North America, Inc.|Autonomous security drone system and method|
US10510158B1|2017-11-13|2019-12-17|Amazon Technologies, Inc.|Collaborative airborne object tracking systems and methods|
CN108845335A|2018-05-07|2018-11-20|中国人民解放军国防科技大学|A kind of unmanned aerial vehicle object localization method based on image and navigation information|
KR102162342B1|2020-06-17|2020-10-06|주식회사 아이온커뮤니케이션즈|Drone for obtaining high quality image to inspect solar panel and flight control method thereof|
Legal status:
2017-10-19| PLFP| Fee payment|Year of fee payment: 2 |
2018-04-06| PLSC| Search report ready|Effective date: 20180406 |
2018-10-19| PLFP| Fee payment|Year of fee payment: 3 |
2020-10-16| ST| Notification of lapse|Effective date: 20200914 |
Priority:
Application number | Filing date | Patent title
FR1659605A|FR3056921B1|2016-10-05|2016-10-05|SELF-CONTAINING DRONE-DRIVED VIEWING SYSTEM WITH TARGET TRACKING AND TARGET SHIFTING ANGLE HOLDING.|
FR1659605|2016-10-05|
FR1659605A|FR3056921B1|2016-10-05|2016-10-05|SELF-CONTAINING DRONE-DRIVED VIEWING SYSTEM WITH TARGET TRACKING AND TARGET SHIFTING ANGLE HOLDING.|
EP17194487.9A| EP3306428A1|2016-10-05|2017-10-03|Autonomous system for collecting moving images by a drone with target tracking and upkeep of the target collection angle|
US15/726,337| US20180095469A1|2016-10-05|2017-10-05|Autonomous system for shooting moving images from a drone, with target tracking and holding of the target shooting angle|
CN201710931130.0A| CN107918397A|2016-10-05|2017-10-09|The autonomous camera system of unmanned plane mobile image kept with target following and shooting angle|